Adaptive Scales of Spatial Integration and Response Latencies in a Critically-Balanced Model of the Primary Visual Cortex
The brain processes visual inputs having structure over a large range of
spatial scales. The precise mechanisms or algorithms used by the brain to
achieve this feat are largely unknown and remain an open problem in visual
neuroscience. In particular, the spatial extent in visual space over which
primary visual cortex (V1) performs evidence integration has been shown to
change as a function of contrast and other visual parameters, thus adapting
scale in visual space in an input-dependent manner. We demonstrate that a
simple dynamical mechanism---dynamical criticality---can simultaneously account
for the well-documented input dependence of three properties of V1: scales of
integration in visuotopic space, extents of lateral integration on the
cortical surface, and response latencies.
Convolutional unitary or orthogonal recurrent neural networks
Recurrent neural networks are extremely powerful yet hard to train. One of
their issues is the vanishing gradient problem, whereby propagation of training
signals may be exponentially attenuated, freezing training. Use of orthogonal
or unitary matrices, whose powers neither explode nor decay, has been proposed
to mitigate this issue, but their computational expense has hindered their use.
Here we show that in the specific case of convolutional RNNs, we can define a
convolutional exponential and that this operation transforms antisymmetric or
anti-Hermitian convolution kernels into orthogonal or unitary convolution
kernels. We explicitly derive FFT-based algorithms to compute the kernels and
their derivatives. The computational complexity of parametrizing this subspace
of orthogonal transformations is thus the same as the networks' iteration.
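The construction described above rests on a classical fact: the exponential of an antisymmetric (or anti-Hermitian) operator is orthogonal (or unitary), and circular convolutions diagonalize under the DFT, so the exponential can be computed frequency-wise. A minimal single-channel 1-D sketch of the underlying algebra (an illustration only, not the paper's multichannel FFT algorithm):

```python
import numpy as np

# Single-channel 1-D sketch (illustrative assumptions throughout; the paper
# treats the general multichannel case). Circular convolution diagonalizes
# under the DFT, so exponentiating a kernel frequency-wise exponentiates the
# convolution operator itself. A circularly antisymmetric kernel,
# k[(n - i) % n] == -k[i], has a purely imaginary DFT, so its exponential
# has unit modulus and the resulting convolution is orthogonal.
n = 8
rng = np.random.default_rng(0)
a = rng.standard_normal(n)
k = a - np.roll(a[::-1], 1)        # antisymmetrize: k[(n - i) % n] == -k[i]

K = np.exp(np.fft.fft(k))          # "convolutional exponential" in frequency space
q = np.fft.ifft(K).real            # orthogonal convolution kernel (imag part ~ 0)

# Circulant matrix of q: Q[i, j] = q[(i - j) % n], i.e. column j is q shifted by j
Q = np.stack([np.roll(q, j) for j in range(n)], axis=1)
print(np.allclose(Q.T @ Q, np.eye(n)))  # True
```

In the multichannel case each frequency carries a small channel-mixing matrix, so the scalar `np.exp` above would be replaced by a per-frequency matrix exponential.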
Dynamical and Statistical Criticality in a Model of Neural Tissue
For the nervous system to work at all, a delicate balance of excitation and
inhibition must be achieved. However, when such a balance is sought by global
strategies, only a few modes remain balanced close to instability, while all
other modes are strongly stable. Here we present a simple model of neural
tissue in which this balance is sought locally by neurons following
'anti-Hebbian' behavior: all degrees of freedom achieve a close balance of
excitation and inhibition and become "critical" in the dynamical sense. At long
timescales, the modes of our model oscillate around the instability line, so an
extremely complex "breakout" dynamics ensues in which different modes of the
system oscillate between prominence and extinction. We show that the system
develops various anomalous statistical behaviours and hence becomes
self-organized critical in the statistical sense.
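A toy illustration of the local balancing idea (the specific update rule and every parameter value below are assumptions made for the sketch, not the paper's model): a single linear mode with a fixed excitatory gain and an adaptive inhibitory gain that grows when activity is high and relaxes when it is low, so the net gain self-tunes toward the instability line at 1.

```python
import numpy as np

# Toy sketch only: one linear mode x with excitatory gain w_exc and an
# adaptive inhibitory gain w_inh (tanh-bounded update; all values are
# illustrative assumptions). The 'anti-Hebbian' rule raises inhibition when
# activity exceeds a target and lowers it otherwise, driving the net gain
# w_exc - w_inh toward the instability line at 1, i.e. a close balance of
# excitation and inhibition.
rng = np.random.default_rng(0)
w_exc, w_inh, eta, theta = 1.02, 0.0, 1e-3, 0.02
x = 0.1
for _ in range(30000):
    x = (w_exc - w_inh) * x + 0.01 * rng.standard_normal()
    w_inh += eta * np.tanh(x * x - theta)  # anti-Hebbian: inhibition tracks activity

net_gain = w_exc - w_inh
print(f"net gain after adaptation: {net_gain:.3f}")  # hovers near 1
```

With many such modes balancing locally rather than through one global control, each hovers near instability, which is the regime in which the abstract's "breakout" dynamics between prominence and extinction can arise.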
Noise-induced memory in extended excitable systems
We describe a form of memory exhibited by extended excitable systems driven
by stochastic fluctuations. Under such conditions, the system self-organizes
into a state characterized by power-law correlations, thus retaining long-term
memory of previous states. The exponents are robust and model-independent. We
discuss novel implications of these results for the functioning of cortical
neurons as well as for networks of neurons.